Search for: All records

Creators/Authors contains: "Turton, Terece L"


  1. The explosive growth in supercomputing capacity has changed simulation paradigms: rather than running a few lengthy simulations, scientists now run ensembles of many simulations with varying initial conditions or input parameters. An ensemble therefore comprises large volumes of multi-dimensional data that can exceed exascale boundaries. However, the disparity between the growth rates of storage and computing resources creates I/O bottlenecks, making it impractical to analyze such massive simulation ensembles with conventional post-processing and visualization tools. In situ visualization approaches alleviate I/O constraints by saving predetermined visualizations to image databases during the simulation. Nevertheless, because the raw output data are unavailable, in situ approaches limit the flexibility of post hoc exploration. Much research has been conducted to mitigate this limitation, but it falls short of simultaneously exploring and analyzing the parameter and ensemble spaces. In this paper, we propose an expert-in-the-loop visual exploration and analytics approach that leverages feature extraction, deep learning, and human expert-AI collaboration to explore and analyze image-based ensembles. Our approach uses local features and deep learning techniques to learn the image features of ensemble members. The extracted features are then combined with the simulation input parameters and fed to the visualization pipeline for in-depth exploration and analysis using human expert + AI interaction techniques. We demonstrate the effectiveness of our approach on several scientific simulation ensembles. (A hypothetical sketch of the feature-extraction step appears after this list.)
  2. A significant challenge on an exascale computer is that the speed at which we compute results exceeds, by many orders of magnitude, the speed at which we can save those results. The Exascale Computing Project (ECP) ALPINE project therefore focuses on providing exascale-ready visualization solutions, including in situ processing. In situ visualization and analysis run alongside the simulation, operating on simulation results as they are generated and avoiding the need to save entire simulations to storage for later analysis. The ALPINE project made the post hoc visualization tools ParaView and VisIt exascale ready and developed in situ algorithms and infrastructures. The suite of ALPINE algorithms developed under ECP includes novel approaches that enable automated data analysis and visualization to focus on the most important aspects of the simulation. Many of the algorithms also provide data-reduction benefits to meet the I/O challenges at exascale. ALPINE also developed a new lightweight in situ infrastructure, Ascent (a minimal usage sketch follows after this list).
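To make the pipeline summarized in entry 1 concrete, here is a minimal, hypothetical sketch (not the authors' code) of its feature-extraction stage: a pretrained CNN turns each in situ rendered ensemble image into a deep feature vector, that vector is concatenated with the member's simulation input parameters, and the combined features are projected to 2D for visual exploration. The file names, parameter values, ResNet-18 backbone, and PCA projection are all illustrative assumptions standing in for whatever feature extractor and embedding the paper actually uses.

```python
# Hypothetical sketch: deep image features + input parameters for ensemble exploration.
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from sklearn.decomposition import PCA

# Pretrained backbone used only as a fixed feature extractor (assumption: ResNet-18).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # drop the classification head, keep 512-D features
backbone.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def image_features(path: str) -> np.ndarray:
    """Return a 512-D deep feature vector for one rendered ensemble image."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return backbone(img).squeeze(0).numpy()

# Hypothetical ensemble: (rendered image path, simulation input-parameter vector) per member.
members = [
    ("member_000.png", [0.1, 1.5]),
    ("member_001.png", [0.2, 1.5]),
    ("member_002.png", [0.1, 2.0]),
]

# Combine image features with input parameters for each member.
feats = np.stack([np.concatenate([image_features(path), np.asarray(params)])
                  for path, params in members])

# 2-D embedding of the combined features for an overview scatter plot.
coords = PCA(n_components=2).fit_transform(feats)
print(coords)
```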
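Entry 2 mentions Ascent, ALPINE's lightweight in situ infrastructure. The sketch below follows the publish/execute pattern shown in Ascent's public tutorials, using a Conduit Blueprint example mesh as a stand-in for data a simulation would publish each cycle; the scene, field, and image names are illustrative, and an actual simulation would publish its own mesh instead of the "braid" example.

```python
# Sketch of Ascent's in situ publish/execute pattern (based on the Ascent tutorials).
import conduit
import conduit.blueprint
import ascent

# Build an example mesh with the Conduit Blueprint "braid" generator.
# In a real simulation, this node would describe the simulation's own mesh and fields.
mesh = conduit.Node()
conduit.blueprint.mesh.examples.braid("hexs", 25, 25, 25, mesh)

# Open Ascent and publish the mesh for this cycle.
a = ascent.Ascent()
a.open()
a.publish(mesh)

# Describe actions: render a pseudocolor scene of the "braid" field to an image file.
actions = conduit.Node()
add_act = actions.append()
add_act["action"] = "add_scenes"
scenes = add_act["scenes"]
scenes["s1/plots/p1/type"] = "pseudocolor"
scenes["s1/plots/p1/field"] = "braid"
scenes["s1/image_name"] = "braid_render"

a.execute(actions)
a.close()
```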